Constructing deterministic finite-state automata in recurrent neural networks
Authors
Abstract
Similar articles
Constructing Deterministic Finite-State Automata in Recurrent Neural Networks
Recurrent neural networks that are trained to behave like deterministic finite-state automata (DFAs) can show deteriorating performance when tested on long strings. This deteriorating performance can be attributed to the instability of the internal representation of the learned DFA states. The use of a sigmoidal discriminant function together with the recurrent structure contributes to this insta...
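As an illustration of the stability issue this abstract points to, the following minimal Python sketch (our construction for illustration, not the paper's) hard-wires the two-state parity DFA into a second-order recurrent network with sigmoid units. With weak weights the encoded state drifts toward 0.5 on a long string, while strong, saturating weights keep it pinned near a DFA state.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative sketch: encode the 2-state parity DFA (the state flips on input '1')
# as a second-order recurrent network s_i(t+1) = sigmoid(H * (sum_jk W[i,j,k] s_j(t) x_k(t) - 0.5)),
# and observe how the weight strength H affects stability of the state encoding on long strings.

# DFA transition table: delta[state, symbol] -> next state
delta = np.array([[0, 1],
                  [1, 0]])  # parity automaton

# Second-order weights: W[i, j, k] = 1 if delta[j, k] == i, else 0
W = np.zeros((2, 2, 2))
for j in range(2):
    for k in range(2):
        W[delta[j, k], j, k] = 1.0

def run(string, H):
    """Run the sigmoid network on a 0/1 string; return the final state activations."""
    s = np.array([1.0, 0.0])           # start in DFA state 0
    for sym in string:
        x = np.eye(2)[sym]             # one-hot input symbol
        net = np.einsum('ijk,j,k->i', W, s, x)
        s = sigmoid(H * (net - 0.5))   # H controls how saturated the state units are
    return s

long_string = [1] * 200                 # 200 ones: parity ends in state 0
print("weak weights   (H=2): ", run(long_string, 2.0))   # activations drift toward 0.5
print("strong weights (H=10):", run(long_string, 10.0))  # stays near a DFA corner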
Recurrent Neural Networks Learn Deterministic Representations of Fuzzy Finite-State Automata
The paradigm of deterministic finite-state automata (DFAs) and their corresponding regular languages has been shown to be very useful for addressing fundamental issues in recurrent neural networks. The issues that have been addressed include knowledge representation, extraction, and refinement, as well as the development of advanced learning algorithms. Recurrent neural networks are also very promising t...
Injecting Nondeterministic Finite State Automata into Recurrent Neural Networks
In this paper we propose a method for injecting time-warping nondeterministic finite state automata into recurrent neural networks. The proposed algorithm takes as input a set of automata transition rules and produces a recurrent architecture. The resulting connection weights are specified by means of linear constraints. In this way, the network is guaranteed to carry out the assigned automata rul...
Finite State Automata and Simple Recurrent Networks
We explore a network architecture introduced by Elman (1988) for predicting successive elements of a sequence. The network uses the pattern of activation over a set of hidden units from time-step t-1, together with element t, to predict element t+1. When the network is trained with strings from a particular finite-state grammar, it can learn to be a perfect finite-state recognizer for the gra...
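A minimal forward-pass sketch of the architecture described above, in Python with illustrative sizes and randomly initialised weights (training omitted): the hidden activations from step t-1 are fed back as a context layer and combined with element t to produce a predicted distribution over element t+1.

import numpy as np

rng = np.random.default_rng(0)
n_symbols, n_hidden = 5, 8            # illustrative sizes, not from the paper

W_in  = rng.normal(scale=0.5, size=(n_hidden, n_symbols))   # input   -> hidden
W_ctx = rng.normal(scale=0.5, size=(n_hidden, n_hidden))    # context -> hidden
W_out = rng.normal(scale=0.5, size=(n_symbols, n_hidden))   # hidden  -> output

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def predict_next(sequence):
    """Forward pass over a sequence of symbol indices; returns, for each
    element t, the predicted distribution over element t+1."""
    context = np.zeros(n_hidden)              # hidden state carried over from step t-1
    predictions = []
    for sym in sequence:
        x = np.eye(n_symbols)[sym]            # one-hot encoding of element t
        hidden = np.tanh(W_in @ x + W_ctx @ context)
        predictions.append(softmax(W_out @ hidden))   # distribution over element t+1
        context = hidden                      # becomes the context for the next step
    return predictions

print(predict_next([0, 2, 1, 3])[-1])         # predicted distribution for the element after '3'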
Recurrent Neural Networks and Finite Automata
This article studies finite-size networks that consist of interconnections of synchronously evolving processors. Each processor updates its state by applying an activation function to a linear combination of the previous states of all units. We prove that any function for which the left and right limits exist and are different can be applied to the neurons to yield a network which is at least a...
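In a generic form of the update rule described above (notation ours, not the article's), each unit i computes x_i(t+1) = sigma(sum_j a_ij x_j(t) + b_i), where a_ij are the connection weights, b_i is a bias, and sigma is the activation function whose left and right limits exist and differ.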
Journal
Journal title: Journal of the ACM
Year: 1996
ISSN: 0004-5411, 1557-735X
DOI: 10.1145/235809.235811